feat: add `drop_nulls`, `fill_null`, and `filter` Spark Expressions #1802
base: main
Conversation
Thanks for your PR, really appreciate it! To be honest, I'm not totally sold on having expressions which change length but don't aggregate be supported in the lazy API. We may be able to support them, but only if they're followed by an aggregation, so they're kinda like the equivalent of SQL's …
Will get back to you on this anyway, but for now we may need to punt on these methods, really sorry
No worries - kinda felt like a cheat move anyway, so not torn to see it punted.
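For context, a minimal sketch of the pattern discussed above, shown with plain polars (where a length-changing expression inside a lazy query is fine when it ends in an aggregation). This is illustrative only - the data and column name are made up, and it makes no claim about what narwhals itself supports:

```python
# Illustrative sketch (plain polars, not the narwhals API): `drop_nulls`
# changes the column's length, but the trailing `sum()` aggregates it back
# to a single row, so the lazy engine can still resolve the query.
import polars as pl

lf = pl.LazyFrame({"a": [1, None, 3, None]})
result = lf.select(pl.col("a").drop_nulls().sum()).collect()
print(result)  # one row: a = 4
```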
What type of PR is this? (check all applicable)
Related issues
Checklist
If you have comments or can explain your changes, please do so below
Added support for the following methods - made sure to remove all pyspark constructor checks in the main test suite:

- `drop_nulls`
- `fill_null` (the only strategies supported by `nw.Expr.fill_null` are "forward" and "backward" - are there plans to add more strategies (possibly in v2)? see the first sketch below)
- `filter` (work in progress: `pyspark.sql.functions.filter` expects the predicate to return a `Column`; however, it currently raises an error since `_from_call` returns a `SparkLikeExpr` - see the second sketch below).
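Two hedged sketches for the items above; data, column names, and output values are illustrative and not taken from this PR.

First, the "forward" and "backward" strategies exposed by the public `fill_null` API:

```python
# Illustrative sketch of the "forward" and "backward" fill strategies.
import narwhals as nw
import polars as pl

df = nw.from_native(pl.DataFrame({"a": [1, None, None, 4]}))
out = df.select(
    nw.col("a").fill_null(strategy="forward").alias("forward"),
    nw.col("a").fill_null(strategy="backward").alias("backward"),
)
print(out.to_native())
# forward:  [1, 1, 1, 4]    backward: [1, 4, 4, 4]
```

Second, how `pyspark.sql.functions.filter` is normally called: its predicate is a Python callable mapping a `Column` to a boolean `Column`, which is where the mismatch with an expression wrapper such as `SparkLikeExpr` comes in. This does not reproduce the PR's internals (`_from_call` etc.), only the pyspark side of the constraint:

```python
# Illustrative sketch: pyspark.sql.functions.filter works on array columns and
# takes a callable whose argument and return value are Columns. Passing a
# wrapper object (e.g. a SparkLikeExpr) instead of such a callable/Column is
# the kind of mismatch described above.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([([1, None, 3],)], "values: array<int>")

out = df.select(F.filter("values", lambda x: x.isNotNull()).alias("values"))
out.show()  # [1, 3]
```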